364 research outputs found

    How a Diverse Research Ecosystem Has Generated New Rehabilitation Technologies: Review of NIDILRR’s Rehabilitation Engineering Research Centers

    Over 50 million United States citizens (1 in 6 people in the US) have a developmental, acquired, or degenerative disability. The average US citizen can expect to live 20% of his or her life with a disability. Rehabilitation technologies play a major role in improving the quality of life for people with a disability, yet widespread and highly challenging needs remain. Within the US, a major effort aimed at the creation and evaluation of rehabilitation technology has been the Rehabilitation Engineering Research Centers (RERCs) sponsored by the National Institute on Disability, Independent Living, and Rehabilitation Research. As envisioned at their conception by a panel of the National Academy of Sciences in 1970, these centers were intended to take a “total approach to rehabilitation”, combining medicine, engineering, and related science to improve the quality of life of individuals with a disability. Here, we review the scope, achievements, and ongoing projects of an unbiased sample of 19 currently active or recently terminated RERCs. Specifically, for each center, we briefly explain the needs it targets, summarize key historical advances, identify emerging innovations, and consider future directions. Our assessment from this review is that the RERC program indeed involves a multidisciplinary approach, with 36 professional fields involved, although 70% of research and development staff are in engineering fields, 23% in clinical fields, and only 7% in basic science fields; significantly, 11% of the professional staff have a disability related to their research. We observe that the RERC program has substantially diversified the scope of its work since the 1970s, addressing more types of disabilities using more technologies and, in particular, now often focusing on information technologies. RERC work now also often frames users as integrated into an interdependent society through technologies co-used by people with and without disabilities (such as the internet, wireless communication, and architecture). In addition, RERC research has evolved to view users not as static but as able to improve outcomes through learning, exercise, and plasticity, which can be optimally timed. We provide examples of rehabilitation technology innovation produced by the RERCs that illustrate this increasingly diverse scope and evolving perspective. We conclude by discussing growth opportunities and possible future directions of the RERC program.

    Secondary contact and admixture between independently invading populations of the western corn rootworm, Diabrotica virgifera virgifera, in Europe

    The western corn rootworm, Diabrotica virgifera virgifera (Coleoptera: Chrysomelidae), is one of the most destructive pests of corn in North America and is currently invading Europe. Two major invasive outbreaks of the rootworm have occurred in Europe: one in North-West Italy and one in Central and South-Eastern Europe. These two outbreaks originated from independent introductions from North America, and secondary contact probably occurred between them in North Italy in 2008. We conducted a population genetics study using 13 microsatellite markers to demonstrate that this geographic contact resulted in a zone of admixture in the Italian region of Veneto. We show that i) genetic variation is greater in the contact zone than in the parental outbreaks; ii) several signs of admixture were detected in some Venetian samples, both in a Bayesian analysis of population structure and in an approximate Bayesian computation analysis of historical scenarios; and iii) allelic frequency clines were observed at microsatellite loci. The contact between the invasive outbreaks in North-West Italy and Central and South-Eastern Europe thus resulted in a zone of admixture with particular characteristics. The evolutionary implications of the existence of a zone of admixture in Northern Italy, and its possible impact on the invasion success of the western corn rootworm, are discussed.

    A comparative evaluation of the effect of internet-based CME delivery format on satisfaction, knowledge and confidence

    Background: Internet-based instruction in continuing medical education (CME) has been associated with favorable outcomes. However, more direct comparative studies of different Internet-based interventions, instructional methods, presentation formats, and approaches to implementation are needed. The purpose of this study was to conduct a comparative evaluation of two Internet-based CME delivery formats and their effect on satisfaction, knowledge, and confidence outcomes. Methods: Evaluative outcomes of two differing formats of an Internet-based CME course with identical subject matter were compared. A Scheduled Group Learning format involved case-based asynchronous discussions with peers and a facilitator over a scheduled 3-week delivery period. An eCME On Demand format did not include facilitated discussion and was not based on a schedule; participants could start and finish at any time. A retrospective, pre-post evaluation study comparing identical satisfaction, knowledge, and confidence outcome measures was conducted. Results: Participants in the Scheduled Group Learning format reported significantly higher mean satisfaction ratings in some areas, performed significantly better on a post-knowledge assessment, and reported significantly higher post-confidence scores than participants in the eCME On Demand format. Conclusions: The findings support the instructional benefits of a scheduled delivery format and facilitated asynchronous discussion in Internet-based CME.

    The effect of socioeconomic status on three-year mortality after first-ever ischemic stroke in Nanjing, China

    BACKGROUND: Low socioeconomic status (SES) is associated with increased mortality after stroke in developed countries. This study was performed to determine whether a similar association also exists in China. METHODS: A total of 806 patients with first-ever ischemic stroke were enrolled between August 1999 and August 2005, and three-year all-cause mortality following the stroke was determined. Level of education, occupation, taxable income, and housing space were used as indicators of SES. Stepwise univariate and multivariate Cox proportional hazards models were used to study the association between the SES measures and three-year mortality. RESULTS: Our analyses confirmed that occupation, taxable income, and housing space were significantly associated with three-year mortality after first-ever stroke. Manual workers had a significant hazard ratio of 5.44 (95% CI 2.75 to 10.77) for death within three years compared with non-manual workers. Those in the zero-income group had a significant hazard ratio of 5.35 (95% CI 2.95 to 9.70), and those in the intermediate-income group 2.10 (95% CI 1.24 to 3.58), compared with those in the highest-income group. Those in two of the three groups with the smallest housing space also had significant hazard ratios of 2.06 (95% CI 1.16 to 3.65) and 1.68 (95% CI 1.12 to 2.52) compared with those in the group with the largest housing space. These hazard ratios remained largely unchanged after multivariate adjustment for age, gender, baseline cardiovascular disease risk factors, and stroke severity. The analyses did not confirm an association with educational level. CONCLUSION: Lower SES has a negative impact on the outcome of first-ever stroke in Nanjing, China. This confirms the need to improve preventive and secondary care for stroke among low-SES groups.

    The "Clubs against Drugs" program in Stockholm, Sweden: two cross-sectional surveys examining drug use among staff at licensed premises

    Background: The objective of this study was to examine self-reported drug use among staff at licensed premises, the types of drugs used, attitudes towards drugs, and observed drug use among guests. Results are presented from two measurement points (2001 and 2007/08). This study was carried out within the framework of the "Clubs against Drugs" program, a community-based multi-component intervention targeting licensed premises in Stockholm, Sweden. Methods: Two cross-sectional surveys were conducted, the first in 2001 and the second in 2007/08. Staff at licensed premises attending server training were asked to participate in the anonymous survey. The survey was administered in a classroom setting and consisted of four sections: 1) demographics, 2) respondents' own drug use experience, 3) respondents' attitudes towards drug use, and 4) observed drug use among guests at licensed premises. Results: Data were collected from 446 staff in 2001 and 677 staff in 2007/08. The four most commonly used drugs among staff were cannabis, cocaine, amphetamine, and ecstasy. The highest rates of drug use were reported by staff in the two youngest age groups, i.e., those younger than 25 and those between the ages of 25 and 29. Staff in 2007/08 reported significantly lower rates of drug use than staff in 2001: last-year drug use was 19% in the 2007/08 sample compared to 27% in the 2001 sample. While drug-using staff reported more observations of drug use among guests than non-drug-using staff, they were less inclined to intervene. Overall, staff reported restrictive attitudes towards drugs. Conclusions: The prevalence of lifetime and last-year drug use among staff at licensed premises is high compared to the general population in Sweden, although lower rates of self-reported drug use were reported in 2007/08. The results of this study highlight that staff at licensed premises represent an important target population in club drug prevention programs.

    Usefulness of C-reactive protein as a marker of early post-infarct left ventricular systolic dysfunction

    Objective: To assess the usefulness of in-hospital measurement of C-reactive protein (CRP) concentration, in comparison to well-established risk factors, as a marker of post-infarct left ventricular systolic dysfunction (LVSD) at discharge. Materials and methods: Two hundred and four consecutive patients with ST-segment-elevation myocardial infarction (STEMI) were prospectively enrolled in the study. CRP plasma concentrations were measured before reperfusion, 24 h after admission, and at discharge with an ultra-sensitive latex immunoassay. Results: CRP concentration increased significantly during the first 24 h of hospitalization (2.4 ± 1.9 vs. 15.7 ± 17.0 mg/L; p < 0.001) and remained elevated at discharge (14.7 ± 14.7 mg/L), mainly in the 57 patients with LVSD (2.4 ± 1.8 vs. 25.0 ± 23.4 mg/L; p < 0.001; CRP at discharge 21.9 ± 18.6 mg/L). The prevalence of LVSD increased significantly across increasing tertiles of CRP concentration, both at 24 h after admission (13.2 vs. 19.1 vs. 51.5%; p < 0.0001) and at discharge (14.7 vs. 23.5 vs. 45.6%; p < 0.0001). Multivariate analysis demonstrated CRP concentration at discharge to be an independent marker of early LVSD (odds ratio of 1.38 per 10 mg/L increase, 95% confidence interval 1.01–1.87; p < 0.04). Conclusion: Measurement of CRP plasma concentration at discharge may be useful as a marker of early LVSD in patients after a first STEMI.
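    As a note on interpreting the effect size reported above: a logistic-regression coefficient is estimated per unit of the predictor, and an odds ratio for a 10 mg/L increase in CRP is obtained as exp(10·β). The sketch below is a minimal, hypothetical illustration of that conversion; the coefficient is back-calculated from the reported odds ratio of 1.38 and is not taken from the study's data.

    ```python
    import math

    # Hypothetical per-mg/L logistic-regression coefficient, chosen so that
    # the odds ratio per 10 mg/L matches the reported value of 1.38.
    beta_per_mg_l = math.log(1.38) / 10

    # Odds ratio for a 10 mg/L increase in CRP: exp(10 * beta).
    or_per_10_mg_l = math.exp(10 * beta_per_mg_l)
    print(round(or_per_10_mg_l, 2))  # 1.38
    ```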

    Understanding the context of balanced scorecard implementation: a hospital-based case study in Pakistan

    Background: As a response to a changing operating environment, healthcare administrators are implementing modern management tools in their organizations. The balanced scorecard (BSC) is considered a viable tool for improving hospital performance in high-income countries, but it has not been applied to hospital settings in low-income countries, nor has the context for its implementation been examined. This study explored contextual perspectives on BSC implementation in a Pakistani hospital. Methods: Four clinical units of the hospital were involved in the BSC implementation based on their willingness to participate. Implementation included sensitization of units towards the BSC, development of specialty-specific BSCs, and reporting of performance based on the BSC during administrative meetings. Pettigrew and Whipp's context (why), process (how), and content (what) framework of strategic change was used to guide data collection and analysis. Data collection methods included quantitative tools (a validated culture assessment questionnaire) and qualitative approaches, including key informant interviews and participant observation. Results: Method triangulation provided common and contrasting results across the four units. A participatory culture, supportive leadership, financial and non-financial incentives, and clear direction, conveyed by integrating support for the BSC into policies, resources, and routine activities, emerged as desirable attributes for BSC implementation. The two units that lagged behind were more involved in direct inpatient care and carried a considerable clinical workload. Role clarification and consensus about the purpose and benefits of the BSC were noted as key strategies for overcoming implementation challenges in the two clinical units that were relatively ahead in BSC implementation. It was also noted that, rather than seeking to replace existing information systems, initiatives such as the BSC could be readily adopted if built on existing infrastructures and data networks. Conclusion: Variable levels of BSC implementation were observed in this study. Those intending to apply the BSC in other hospital settings need to ensure a participatory culture, a clear institutional mandate, appropriate leadership support, a proper reward and recognition system, and sensitization to the benefits of the BSC.

    Multicentre phase II studies evaluating imatinib plus hydroxyurea in patients with progressive glioblastoma

    BACKGROUND: We evaluated the efficacy of imatinib mesylate in addition to hydroxyurea in patients with recurrent glioblastoma (GBM) who were either on or not on enzyme-inducing anti-epileptic drugs (EIAEDs). METHODS: A total of 231 patients with GBM at first recurrence from 21 institutions in 10 countries were enrolled. All patients received 500 mg of hydroxyurea twice a day. Imatinib was administered at 600 mg per day for patients not on EIAEDs and at 500 mg twice a day for patients on EIAEDs. The primary end point was radiographic response rate; secondary end points were safety, progression-free survival at 6 months (PFS-6), and overall survival (OS). RESULTS: The radiographic response rate after centralised review was 3.4%. PFS-6 and median OS were 10.6% and 26.0 weeks, respectively. Outcome did not appear to differ based on EIAED status. The most common grade 3 or greater adverse events were fatigue (7%), neutropaenia (7%), and thrombocytopaenia (7%). CONCLUSIONS: Imatinib in addition to hydroxyurea was well tolerated among patients with recurrent GBM but did not show clinically meaningful anti-tumour activity.